
    Bootstrapping for penalized spline regression.

    We describe and contrast several bootstrapping procedures for penalized spline smoothers. The procedures considered are variations on existing methods, developed under two different probabilistic frameworks. Under the first framework, penalized spline regression is viewed as an estimation technique for an unknown smooth function: the function is represented in a high-dimensional spline basis, with the spline coefficients estimated in penalized form. Under the second framework, the unknown function is treated as a realization of a set of random spline coefficients, which are then predicted in a linear mixed model. We describe how bootstrapping can be implemented under both frameworks, and we show in theory and through simulations and examples that it provides valid inference in both cases. Comparing the inference obtained under the two frameworks, we conclude that the mixed-model framework generally produces better results. The bootstrapping ideas are extended to hypothesis testing, where parametric components of a model are tested against nonparametric alternatives.
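    The residual bootstrap from the first (fixed-function) framework can be sketched in a few lines. This is a minimal illustration, not the paper's procedure: the truncated-power basis, the knot placement, the penalty matrix, and the smoothing parameter lam = 1.0 are all assumptions chosen for simplicity.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: noisy observations of a smooth function.
    n = 200
    x = np.sort(rng.uniform(0, 1, n))
    y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)

    # Truncated-power spline basis with interior knots (illustrative choice).
    knots = np.linspace(0.05, 0.95, 20)
    B = np.column_stack([np.ones(n), x] + [np.clip(x - k, 0, None) for k in knots])

    # Penalized least squares: penalize only the truncated-power coefficients.
    lam = 1.0
    P = np.diag([0.0, 0.0] + [1.0] * len(knots))
    coef = np.linalg.solve(B.T @ B + lam * P, B.T @ y)
    fit = B @ coef

    # Residual bootstrap: resample residuals, refit, collect fitted curves.
    resid = y - fit
    curves = []
    for _ in range(200):
        y_star = fit + rng.choice(resid, size=n, replace=True)
        c_star = np.linalg.solve(B.T @ B + lam * P, B.T @ y_star)
        curves.append(B @ c_star)
    curves = np.array(curves)

    # Pointwise 95% bootstrap band around the penalized fit.
    lo, hi = np.percentile(curves, [2.5, 97.5], axis=0)
    print(float(np.mean(hi - lo)))
    ```

    The band is built around the penalized fit and so inherits its smoothing bias; the paper's comparison of the two frameworks addresses exactly this kind of issue.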

    A sharp look at the gravitationally lensed quasar SDSS J0806+2006 with Laser Guide Star Adaptive Optics

    We present the first VLT near-IR observations of a gravitationally lensed quasar obtained using adaptive optics with a laser guide star. These observations can be considered a test bench for future systematic observations of lensed quasars with adaptive optics, even when no bright natural guide star is available in the nearby field. With only 14 minutes of observing time, we derived very accurate astrometry of the quasar images and of the lensing galaxy, with 0.05 arcsec spatial resolution, comparable to the Hubble Space Telescope (HST). In combination with deep VLT optical spectra of the quasar images, we use our adaptive optics images to constrain simple models for the mass distribution of the lensing galaxy. The latter is almost circular and does not need any strong external shear to fit the data. The time delay predicted for SDSS0806+2006, assuming a singular isothermal ellipsoid model and the concordance cosmology, is Delta t ≃ 50 days. Our optical spectra indicate a flux ratio between the quasar images of A/B = 1.3 in the continuum and A/B = 2.2 in both the MgII and CIII] broad emission lines. This suggests that microlensing affects the continuum emission; however, the constant ratio between the two emission lines indicates that the broad emission line region is not microlensed. Finally, we see no evidence of reddening by dust in the lensing galaxy.
    Comment: 4 pages, published in Astronomy and Astrophysics. Discussion slightly expanded with respect to v1. Typos corrected.
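    For readers more used to magnitudes than flux ratios, the quoted ratios translate directly via the standard definition Delta m = 2.5 log10(A/B); the ratio values are taken from the abstract.

    ```python
    import math

    # Flux ratios between images A and B quoted in the abstract.
    ratio_continuum = 1.3   # A/B in the continuum
    ratio_lines = 2.2       # A/B in the MgII and CIII] broad emission lines

    # Magnitude differences: Delta m = 2.5 * log10(A/B).
    dm_continuum = 2.5 * math.log10(ratio_continuum)
    dm_lines = 2.5 * math.log10(ratio_lines)
    print(round(dm_continuum, 2), round(dm_lines, 2))  # → 0.28 0.86
    ```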

    Characterizing Entanglement Sources

    We discuss how to characterize entanglement sources with finite sets of measurements. The measurements need not be tomographically complete and may consist of POVMs rather than von Neumann measurements. Our method yields a probability that the source generates an entangled state, as well as estimates of any desired calculable entanglement measures, including their error bars. We apply two criteria, Akaike's information criterion and the Bayesian information criterion, to compare and assess different models (with different numbers of parameters) describing entanglement-generating devices. We discuss differences between standard entanglement-verification methods and our present method of characterizing an entanglement source.
    Comment: This submission, together with the next one, supersedes arXiv:0806.416
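    The model-comparison step can be illustrated on a toy problem. This is a generic AIC/BIC sketch, not the paper's entanglement models: the Gaussian data, the two candidate models, and their parameter counts are assumptions chosen for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy data from a Gaussian with a small nonzero mean.
    n = 500
    data = rng.normal(0.3, 1.0, n)

    def gauss_loglik(x, mu, sigma):
        # Log-likelihood of i.i.d. Gaussian observations.
        return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                      - (x - mu) ** 2 / (2 * sigma**2))

    # Model 1: mean fixed at 0, fit sigma (1 free parameter).
    s1 = np.sqrt(np.mean(data**2))
    ll1 = gauss_loglik(data, 0.0, s1)

    # Model 2: fit both mean and sigma (2 free parameters).
    mu2, s2 = data.mean(), data.std()
    ll2 = gauss_loglik(data, mu2, s2)

    def aic(ll, k):
        # Akaike information criterion: penalty 2 per parameter.
        return 2 * k - 2 * ll

    def bic(ll, k, n):
        # Bayesian information criterion: penalty log(n) per parameter.
        return k * np.log(n) - 2 * ll

    # Lower is better; the richer model should win here despite its penalty.
    print(aic(ll1, 1) > aic(ll2, 2), bic(ll1, 1, n) > bic(ll2, 2, n))
    ```

    BIC penalizes extra parameters more heavily than AIC for n > e^2, which is why the paper applies both criteria when assessing device models of different sizes.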

    Robust benchmark dose determination based on profile score methods.

    We investigate several methods commonly used to obtain a benchmark dose and show that those based on full likelihood or profile likelihood methods may have severe shortcomings. We propose two new profile likelihood-based approaches that overcome these problems. Another contribution is the extension of benchmark dose determination to non-full-likelihood models, such as quasi-likelihood and generalized estimating equations, which are widely used in settings such as developmental toxicity, where clustered data are encountered. This widening of the scope of application is made possible by the use of (robust) score statistics. The benchmark dose methods are applied to a data set from a developmental toxicity study.
    Keywords: clustered binary data; generalized estimating equations; likelihood ratio; profile likelihood; score statistic; toxicology; quantitative risk assessment; longitudinal data analysis; generalized linear models; developmental toxicity; misspecification.
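    As background, a plain maximum-likelihood benchmark dose under a logistic dose-response model can be computed as below; the paper's profile-score methods target exactly the settings (quasi-likelihood, GEE, clustered data) where this simple recipe breaks down. All data values and the benchmark response of 10% extra risk are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import brentq, minimize

    # Hypothetical dose-response data: number affected out of m per dose group.
    doses = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
    affected = np.array([2, 4, 8, 15, 22])
    m = 25  # animals per dose group

    def p(d, a, b):
        # Logistic dose-response curve.
        return 1.0 / (1.0 + np.exp(-(a + b * d)))

    def negloglik(theta):
        a, b = theta
        pr = np.clip(p(doses, a, b), 1e-10, 1 - 1e-10)
        return -np.sum(affected * np.log(pr) + (m - affected) * np.log(1 - pr))

    # Maximum-likelihood fit of the two logistic parameters.
    a_hat, b_hat = minimize(negloglik, x0=[-2.0, 1.0]).x

    # Benchmark dose: the dose giving BMR extra risk over background,
    # i.e. (p(d) - p(0)) / (1 - p(0)) = BMR.
    BMR = 0.10
    p0 = p(0.0, a_hat, b_hat)
    extra = lambda d: (p(d, a_hat, b_hat) - p0) / (1 - p0) - BMR
    bmd = brentq(extra, 1e-6, 10.0)
    print(round(bmd, 2))
    ```

    The paper's point is that confidence limits for this dose from full-likelihood profiles can misbehave, motivating the robust score-based alternatives.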

    A power-law distribution of phase-locking intervals does not imply critical interaction

    Neural synchronisation plays a critical role in information processing, storage and transmission. Characterising the pattern of synchronisation is therefore of great interest. It has recently been suggested that the brain displays broadband criticality, based on two measures of synchronisation - phase-locking intervals and global lability of synchronisation - showing power-law statistics at the critical threshold in a classical model of synchronisation. In this paper, we provide evidence that, within the limits of the model selection approach used to ascertain the presence of power-law statistics, the pooling of pairwise phase-locking intervals from a non-critically interacting system can produce a distribution that is similarly assessed as being power law. In contrast, the global lability of synchronisation measure is shown to better discriminate critical from non-critical interaction.
    Comment: (v3) Fixed error in Figure 1; (v2) added references, minor edits throughout, clarified the relationship between the theoretical critical coupling for an infinite-size system and the 'effective' critical coupling for a finite-size system, improved presentation and discussion of results (results unchanged), revised Figure 1 to include error bars on r and N (results unchanged); (v1) 11 pages, 7 figures.
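    The pooling effect can be mimicked with a toy experiment: intervals that are individually exponential (no power law) are pooled across "pairs" with widely varying rates, and a power-law exponent is then fitted to the pooled tail. This illustrates the pitfall only; it is not the paper's Kuramoto-based analysis, and the rate distribution and the fixed xmin are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Each "pair" produces exponentially distributed intervals, but the
    # rates vary widely across pairs (lognormal spread, illustrative).
    n_pairs, n_per_pair = 200, 500
    rates = rng.lognormal(mean=0.0, sigma=2.0, size=n_pairs)
    pooled = np.concatenate([rng.exponential(1.0 / r, n_per_pair) for r in rates])

    # Maximum-likelihood power-law exponent for the tail above a fixed xmin
    # (Hill-type estimator): alpha = 1 + n / sum(log(x / xmin)).
    xmin = np.quantile(pooled, 0.5)
    tail = pooled[pooled >= xmin]
    alpha = 1.0 + len(tail) / np.sum(np.log(tail / xmin))
    print(round(alpha, 2))
    ```

    A mixture of exponentials with heavy-tailed rates produces a pooled distribution whose tail can pass a power-law fit, even though no single pair is critical - the phenomenon the paper warns about.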

    Heat transport in insulators from ab initio Green-Kubo theory

    The Green-Kubo theory of thermal transport has long been considered incompatible with modern simulation methods based on electronic-structure theory, because it relies on concepts such as the energy density and current, which are ill-defined at the quantum-mechanical level. Besides, experience with classical simulations indicates that estimating heat-transport coefficients requires analysing molecular trajectories more than one order of magnitude longer than deemed feasible with ab initio molecular dynamics. In this paper we report on recent theoretical advances that allow these two obstacles to be overcome. First, a general gauge-invariance principle has been established, stating that the thermal conductivity is insensitive to many details of the microscopic expression for the energy density and current from which it is derived; this permits a rigorous expression for the energy flux to be derived from Density-Functional Theory, from which the conductivity can be computed in practice. Second, a novel data-analysis method based on the statistical theory of time series has been proposed, which considerably reduces the simulation time required to achieve a target accuracy on the computed conductivity. These concepts are illustrated in detail, starting from a pedagogical introduction to the Green-Kubo theory of linear response and transport, and demonstrated with a few applications using both classical and quantum-mechanical simulation methods.
    Comment: 36 pages, 14 figures
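    The classical Green-Kubo recipe - integrate the heat-flux autocorrelation function over time - can be sketched on a synthetic flux signal. This is a toy stand-in for an MD trajectory: the AR(1) flux model, the time units, and the prefactor (set to 1 here in place of V / (k_B T^2)) are all illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic heat-flux time series with a known correlation time tau.
    n, dt, tau = 200_000, 1.0, 20.0
    noise = rng.normal(size=n)
    phi = np.exp(-dt / tau)
    J = np.empty(n)
    J[0] = noise[0]
    for t in range(1, n):
        # AR(1) process: its autocorrelation decays as exp(-k*dt/tau).
        J[t] = phi * J[t - 1] + np.sqrt(1 - phi**2) * noise[t]

    # Green-Kubo: conductivity ∝ time integral of the flux autocorrelation.
    max_lag = 200
    acf = np.array([np.mean(J[: n - k] * J[k:]) for k in range(max_lag)])
    kappa = (acf[0] / 2 + acf[1:].sum()) * dt  # trapezoid-style integral

    # For this AR(1) model the integral should approach tau * var(J).
    print(round(kappa, 1), round(tau * J.var(), 1))
    ```

    The slow convergence of this integral with trajectory length is precisely the statistical problem that the paper's time-series analysis method is designed to tame.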

    Minimax optimal procedures for testing the structure of multidimensional functions

    We present a novel method for detecting structural characteristics of multidimensional functions. We consider the multidimensional Gaussian white noise model with an anisotropic estimand. Using the relation between the Sobol decomposition and the geometry of multidimensional wavelet bases, we build test statistics for any of the Sobol functional components. We assess the asymptotic minimax optimality of these test statistics and show that they are optimal in the presence of anisotropy with respect to the newly determined minimax rates of separation. An appropriate combination of these test statistics allows one to test general structural characteristics such as the atomic dimension or the presence of particular variables. Numerical experiments show the potential of our method for studying spatio-temporal processes.
    Acknowledgements: G. Claeskens and J.-M. Freyermuth acknowledge the support of the Fund for Scientific Research Flanders, KU Leuven grant GOA/12/14, and the IAP Research Network P7/06 of the Belgian Science Policy. Jean-Marc Freyermuth's and John Aston's research was supported by the Engineering and Physical Sciences Research Council [EP/K021672/2].
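    The Sobol (functional ANOVA) decomposition underlying these test statistics can be computed directly on a grid. This toy sketch uses plain averaging on a noiseless function rather than the paper's wavelet-based minimax-optimal statistics; the test function and grid are assumptions.

    ```python
    import numpy as np

    # Evaluate a bivariate function on a grid; here x2 is inactive.
    g = np.linspace(0, 1, 101)
    X1, X2 = np.meshgrid(g, g, indexing="ij")
    f = np.sin(2 * np.pi * X1) + 0.0 * X2

    # Sobol / functional ANOVA components by marginal averaging.
    mu = f.mean()                              # constant component
    f1 = f.mean(axis=1) - mu                   # main effect of x1
    f2 = f.mean(axis=0) - mu                   # main effect of x2
    f12 = f - mu - f1[:, None] - f2[None, :]   # interaction component

    # Component energies: a large value signals that the variable or
    # interaction is present - the quantity the paper builds tests around.
    print(round(np.mean(f1**2), 3),
          round(np.mean(f2**2), 3),
          round(np.mean(f12**2), 3))
    ```

    Here the x1 component carries essentially all the energy while the x2 and interaction components vanish, which is how "presence of a variable" is read off the decomposition.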